| Name | Version | Summary | Date |
|------|---------|---------|------|
| mnistvit | 1.4.0 | A vision transformer for training on MNIST | 2025-11-02 13:45:26 |
| lossless-yaml | 0.2.0 | Yet Another YAML AST - programmatically transform YAML, preserving whitespace and comments | 2025-11-01 20:32:39 |
| trading-models | 0.1.6 | MLP, CNN, and Transformer models for time-series trading predictions | 2025-11-01 20:17:15 |
| metnet | 4.1.19 | PyTorch MetNet implementation | 2025-11-01 09:09:53 |
| dreamer4 | 0.0.102 | Dreamer 4 | 2025-10-30 15:46:39 |
| temporal-forecasting | 0.3.1 | A transformer-based model for time series forecasting inspired by modern attention mechanisms | 2025-10-27 08:28:24 |
| fractal-attention-analysis | 1.0.0 | Fractal-Attention Analysis (FAA) framework for LLM interpretability using golden-ratio transformations | 2025-10-26 00:09:48 |
| cell-decipher | 0.3.1 | DECIPHER for learning disentangled cellular embeddings in large-scale heterogeneous spatial omics data | 2025-10-22 09:28:44 |
| hf-vram-calc | 1.0.8 | GPU memory calculator for Hugging Face models with different data types and parallelization strategies | 2025-10-22 01:35:34 |
| haystack-ai | 2.19.0 | LLM framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data | 2025-10-20 12:53:32 |
| gpt-db-core | 0.1.1 | A Django + PyTorch neural database that learns, trains, and chats directly from your database | 2025-10-20 02:07:55 |
| convai-innovations | 1.2.0 | Interactive LLM Training Academy - learn to build language models from scratch | 2025-10-19 08:30:04 |
| apdtflow | 0.2.1 | APDTFlow: a modular forecasting framework for time series data | 2025-10-18 11:14:32 |
| litrwkv | 0.0.1a0 | A clean RWKV language model implementation (alpha version, under development) | 2025-10-16 04:51:09 |
| alibabacloud-ros-tran | 0.20.0 | Resource Orchestration Service template transformer | 2025-10-16 01:38:20 |
| chunkformer | 1.2.1 | ChunkFormer: Masked Chunking Conformer for long-form speech transcription | 2025-10-10 05:06:39 |
| megatron-core | 0.14.0 | Megatron Core - a library for efficient and scalable training of transformer-based models | 2025-10-08 15:04:33 |
| megatron-fsdp | 0.1.0 | Megatron-FSDP - an NVIDIA-developed PyTorch extension providing a high-performance implementation of Fully Sharded Data Parallelism (FSDP) | 2025-10-08 15:04:31 |
| arthemis-tts | 0.1.2 | A simple transformer-based text-to-speech library | 2025-09-09 03:25:57 |
| fractale | 0.0.13 | Jobspec specification and translation layer for cluster work | 2025-09-08 13:52:55 |